Stagewise Weak Gradient Pursuits Part I: Fundamentals and Numerical Studies

Authors

  • Thomas Blumensath
  • Mike E. Davies
Abstract

Finding sparse solutions to underdetermined inverse problems is a fundamental challenge encountered in a wide range of signal processing applications, from signal acquisition to source separation. Recent theoretical advances in our understanding of this problem have further increased interest in applying sparse methods across many domains. In several areas, such as medical imaging or geophysical data acquisition, it is necessary to find sparse solutions to very large underdetermined inverse problems, so fast methods have to be developed. In this paper, we promote a greedy approach in which several new elements are selected in each iteration and the selected coefficients are then updated along a conjugate update direction. This extends the previously suggested Gradient Pursuit framework to allow an even greedier selection strategy. A large set of numerical experiments, using artificial and real-world data, demonstrates the performance of the method. The approach is found to perform consistently better than other fast greedy approaches, such as Regularised Orthogonal Matching Pursuit and Stagewise Orthogonal Matching Pursuit, and to be competitive with other fast approaches, such as those based on l1 minimisation. It is also shown to have the unique property of allowing a smooth trade-off between signal sparsity (or observation dimension) and computational complexity. Theoretical properties of the method are studied in a companion paper [2].
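To make the selection and update rules concrete, below is a minimal NumPy sketch in the spirit of the method described above: a stagewise weak selection step (keep every column whose correlation with the residual is within a factor alpha of the maximum) followed by a directional update that is conjugated against the previous direction only. The function name, the weakness parameter alpha and the stopping rule are illustrative assumptions, not the authors' reference implementation.

import numpy as np

def stagewise_weak_gradient_pursuit(Phi, y, alpha=0.7, tol=1e-6, max_iter=100):
    # Illustrative sketch: stagewise weak selection plus an approximate
    # conjugate direction update (conjugation against the previous
    # direction only). Phi is m x n with m < n.
    m, n = Phi.shape
    x = np.zeros(n)                 # current coefficient estimate
    r = y.astype(float).copy()      # residual y - Phi @ x
    support = np.zeros(n, dtype=bool)
    d_prev = None                   # previous update direction

    for _ in range(max_iter):
        g = Phi.T @ r                                    # correlations with the residual
        support |= np.abs(g) >= alpha * np.abs(g).max()  # stagewise weak selection
        g_s = np.where(support, g, 0.0)                  # gradient restricted to support

        if d_prev is None:
            d = g_s
        else:                                            # conjugate against previous direction
            q = Phi @ d_prev
            d = g_s - ((Phi @ g_s) @ q / (q @ q)) * d_prev

        c = Phi @ d
        if c @ c == 0.0:                                 # residual orthogonal to the support
            break
        a = (r @ c) / (c @ c)                            # exact line search along d
        x += a * d
        r -= a * c
        d_prev = d
        if np.linalg.norm(r) <= tol:
            break
    return x, support

The weakness parameter controls how many elements enter per iteration: alpha close to 1 selects few atoms (closer to a pure greedy method), while smaller values select many at once and reduce the number of iterations, which is the knob behind the sparsity/complexity trade-off mentioned above.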


Similar Articles

Stagewise Weak Gradient Pursuits Part II: Theoretical Properties

In a recent paper [2] we introduced the greedy Gradient Pursuit framework. This is a family of algorithms designed to find sparse solutions to underdetermined inverse problems. One particularly powerful member of this family is the (approximate) conjugate gradient pursuit algorithm, which was shown to be applicable to very large data sets and which was found to perform nearly as well as the tra...


A G space theory and a weakened weak (W2) form for a unified formulation of compatible and incompatible methods: Part I theory

This paper introduces a G space theory and a weakened weak form (W2) using the generalized gradient smoothing technique for a unified formulation of a wide class of compatible and incompatible methods. The W2 formulation works for both finite element method settings and mesh-free settings, and W2 models can have special properties including softened behavior, upper bounds and ultra accuracy. Pa...


A Robust Boosting Method for Mislabeled Data

We propose a new, robust boosting method by using a sigmoidal function as a loss function. In deriving the method, the stagewise additive modelling methodology is blended with gradient descent algorithms. Based on intensive numerical experiments, we show that the proposed method is actually better than AdaBoost and other regularized methods in test error rates in the case of noisy, ...
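One plausible reading of this construction (an assumption on our part, not the paper's exact derivation) is functional gradient boosting in which AdaBoost's unbounded exponential loss is replaced by the bounded sigmoidal loss sigma(-y * F(x)), which caps the influence of mislabeled points. A minimal sketch along those lines, using scikit-learn stumps as the weak learners:

import numpy as np
from sklearn.tree import DecisionTreeRegressor

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def robust_boost(X, y, n_rounds=100, lr=0.1):
    # Stagewise additive modelling with gradient descent in function space:
    # each round fits a stump to the negative gradient of the sigmoidal
    # loss sigma(-y * F(x)), with labels y in {-1, +1}.
    F = np.zeros(len(y), dtype=float)
    learners = []
    for _ in range(n_rounds):
        s = sigmoid(-y * F)
        pseudo_res = y * s * (1.0 - s)    # negative gradient of the loss
        stump = DecisionTreeRegressor(max_depth=1).fit(X, pseudo_res)
        F += lr * stump.predict(X)
        learners.append(stump)
    return learners

def predict(learners, X, lr=0.1):
    # Classify by the sign of the additive model built by robust_boost.
    return np.sign(sum(lr * stump.predict(X) for stump in learners))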


AdaBoost and Forward Stagewise Regression are First-Order Convex Optimization Methods

Boosting methods are highly popular and effective supervised learning methods which combine weak learners into a single accurate model with good statistical performance. In this paper, we analyze two well-known boosting methods, AdaBoost and Incremental Forward Stagewise Regression (FSε), by establishing their precise connections to the Mirror Descent algorithm, which is a first-order method in...


A general framework for fast stagewise algorithms

Forward stagewise regression follows a very simple strategy for constructing a sequence of sparse regression estimates: it starts with all coefficients equal to zero, and iteratively updates the coefficient (by a small amount ε) of the variable that achieves the maximal absolute inner product with the current residual. This procedure has an interesting connection to the lasso: under some conditi...
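Read literally, this strategy is only a few lines of code. The following illustrative NumPy sketch implements it (the step size eps and the fixed iteration count are assumptions):

import numpy as np

def forward_stagewise(X, y, eps=0.01, n_steps=5000):
    # Start with all coefficients at zero and repeatedly nudge, by +/- eps,
    # the coefficient of the variable with the largest absolute inner
    # product with the current residual.
    beta = np.zeros(X.shape[1])
    r = y.astype(float).copy()
    for _ in range(n_steps):
        corr = X.T @ r                    # inner products with the residual
        j = np.argmax(np.abs(corr))       # most correlated variable
        delta = eps * np.sign(corr[j])    # small step in that direction
        beta[j] += delta
        r -= delta * X[:, j]
    return beta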



Journal title:

Volume   Issue

Pages  -

Publication date: 2008